Optimal vs. classical linear dimension reduction

Author

  • Claus Weihs

Abstract

We describe a computer-intensive method for linear dimension reduction which minimizes the classification error directly. Simulated annealing (Bohachevsky et al., 1986) is used to solve this problem. The classification error is determined by an exact integration. We avoid distance or scatter measures, which are only surrogates used to circumvent the classification error. Simulations in two dimensions and analytical approximations demonstrate the superiority of optimal classification over the classical procedures. We compare our procedure to the well-known canonical discriminant analysis (homoscedastic case), as described in McLachlan, and to a method by Young et al. for the heteroscedastic case. Special emphasis is put on the case when the distance-based methods collapse. The computer-intensive algorithm always achieves minimal classification error.

Introduction

Classification deals with the allocation of objects to g predetermined groups G1, …, Gg, say. The goal is to minimize the misclassification rate over all possible future allocations, characterized by the conditional densities pi(x), i = 1, …, g. The minimal error is the so-called Bayes error (McLachlan). Often we want to reduce the dimension of the classification problem to one or two dimensions in order to support human imagination without significantly increasing the misclassification rate. This article deals with linear combinations of the original variables to achieve this goal (linear dimension reduction). The next section reviews the classical approach based on distance measures and presents the idea of Young et al. in a way that facilitates such a distance formulation. Section 3 introduces computer-intensive dimension reduction and simulated annealing. Section 4 compares the classical and the computer-intensive method.

Classical Linear Dimension Reduction

The intuitive idea is to project the data in a way that maximizes the distance between the groups; hopefully this will also minimize the misclassification rate. The distance measure relates the between-group scatter matrix …
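As a rough illustration of the idea, the following sketch searches for a one-dimensional projection of two simulated Gaussian groups by simulated annealing. It is not the paper's exact algorithm: the data, the empirical midpoint-cut error estimate (a crude stand-in for the exact error integration), and all tuning values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated 2-D Gaussian groups (hypothetical example data).
n = 200
X1 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], n)
X2 = rng.multivariate_normal([1.5, 1.5], [[1.0, -0.6], [-0.6, 1.0]], n)
X = np.vstack([X1, X2])
y = np.array([0] * n + [1] * n)

def error_rate(theta):
    """Empirical misclassification rate of the projection x -> (cos t, sin t)'x,
    cutting at the midpoint of the projected group means."""
    a = np.array([np.cos(theta), np.sin(theta)])
    z = X @ a
    m0, m1 = z[y == 0].mean(), z[y == 1].mean()
    cut = 0.5 * (m0 + m1)
    pred = np.where(z > cut, 1, 0) if m1 > m0 else np.where(z > cut, 0, 1)
    return float(np.mean(pred != y))

# Plain simulated annealing over the projection angle theta.
theta = 0.0
cur = error_rate(theta)
best_theta, best = theta, cur
T = 1.0
for _ in range(2000):
    cand = theta + rng.normal(scale=0.3)
    e = error_rate(cand)
    # Always accept improvements; accept worse angles with a probability
    # that shrinks as the temperature T cools.
    if e < cur or rng.random() < np.exp(-(e - cur) / T):
        theta, cur = cand, e
        if e < best:
            best_theta, best = theta, e
    T *= 0.995
```

Because the objective is a raw error rate (piecewise constant, non-differentiable), a stochastic search of this kind is a natural fit where gradient methods are not.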

Similar articles

Direct Minimization of Error Rates

We propose a computer-intensive method for linear dimension reduction which minimizes the classification error directly. Simulated annealing (Bohachevsky et al. 1986), as a modern optimization technique, is used to solve this problem effectively. This approach easily allows the incorporation of user requests by means of penalty terms. Simulations demonstrate the superiority of optimal classification to cl...
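Schematically, incorporating a user request via a penalty term could look like the sketch below. The sparsity penalty, the toy error function, and the weight `lam` are all hypothetical illustrations, not details from the paper.

```python
import numpy as np

def penalized_objective(a, error_fn, lam=0.1):
    """Hypothetical penalized criterion: misclassification error of the
    projection vector a plus a penalty encoding a user request -- here,
    a sparsity penalty on the (normalized) projection weights."""
    a = np.asarray(a, dtype=float)
    a = a / np.linalg.norm(a)  # only the direction of a matters
    return error_fn(a) + lam * float(np.sum(np.abs(a)))

# Toy stand-in for the exact error integration: minimized near a = (0, 1).
toy_error = lambda a: float(a[0] ** 2)

val = penalized_objective([1.0, 1.0], toy_error)
```

The same simulated-annealing search then simply minimizes `penalized_objective` instead of the raw error.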


Facial expression recognition based on Local Binary Patterns

Classical LBP suffers from problems such as complexity and the high dimensionality of its feature vectors, which make dimension-reduction processing necessary. In this paper, we introduce an improved LBP algorithm that solves these problems by utilizing the fast PCA algorithm to reduce the dimensions of the extracted feature vectors. In other words, the proposed method (Fast PCA+LBP) is an improved LBP algorithm that is extracted ...
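The dimension-reduction step described above can be sketched as plain PCA via an SVD. The histogram size, sample count, and number of retained components below are made-up values, and this is generic PCA rather than the paper's specific fast-PCA variant.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical LBP feature matrix: 100 images x 256-bin histograms.
F = rng.random((100, 256))

# PCA via SVD of the centered data; keep the top k principal components.
k = 16
Fc = F - F.mean(axis=0)
U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
reduced = Fc @ Vt[:k].T  # 100 x 16 reduced feature vectors

print(reduced.shape)  # (100, 16)
```

A classifier is then trained on `reduced` instead of the 256-dimensional histograms.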


Fast Linear Discriminant Analysis using QR Decomposition and Regularization

Linear Discriminant Analysis (LDA) is among the most effective dimension reduction methods for classification, providing a high degree of class separability for numerous applications in science and engineering. However, problems arise with this classical method when one or both of the scatter matrices are singular. Singular scatter matrices are not unusual in many applications, especially f...
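The singularity problem and a regularization remedy can be illustrated with a minimal two-class sketch. Only ridge-style regularization is shown (the QR-based acceleration from the abstract is not), and the data dimensions and the value of `eps` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical undersampled data: 10 samples in 20 dimensions, so the
# within-class scatter matrix Sw (rank at most n - #classes) is singular.
n, d = 10, 20
X = rng.normal(size=(n, d))
y = np.array([0] * 5 + [1] * 5)

mu = X.mean(axis=0)
Sw = np.zeros((d, d))
Sb = np.zeros((d, d))
for c in (0, 1):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
    Sb += len(Xc) * np.outer(mc - mu, mc - mu)  # between-class scatter

assert np.linalg.matrix_rank(Sw) < d  # singular: plain inversion fails

# Ridge-regularized LDA: solve (Sw + eps*I)^{-1} Sb w = lambda w and take
# the leading eigenvector as the discriminant direction.
eps = 1e-3
vals, vecs = np.linalg.eig(np.linalg.solve(Sw + eps * np.eye(d), Sb))
w = np.real(vecs[:, np.argmax(np.real(vals))])

z = X @ w  # 1-D projection separating the two class means
```

Adding `eps * I` makes the within-class scatter invertible while perturbing the well-conditioned directions only slightly.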


Dimension reduction based on extreme dependence

We introduce a dimension reduction technique based on extreme observations. The classical assumption of a linear model for the distribution of a random vector is replaced by the weaker assumption of a fairly general model for the copula. We assume an elliptical copula to describe the extreme dependence structure, which preserves a 'correlation-like' structure in the extremes. Based on the tail ...


Neighborhood Preserving Projections (NPP): A Novel Linear Dimension Reduction Method

Dimension reduction is a crucial step for pattern recognition and information retrieval tasks to overcome the curse of dimensionality. In this paper a novel unsupervised linear dimension reduction method, Neighborhood Preserving Projections (NPP), is proposed. In contrast to traditional linear dimension reduction methods, such as principal component analysis (PCA), the proposed method has good n...



Journal:

Volume   Issue

Pages  -

Publication date: 1999